
    On best rank one approximation of tensors

    In this paper we suggest a new algorithm for the computation of a best rank one approximation of tensors, called alternating singular value decomposition. This method is based on the computation of maximal singular values and the corresponding singular vectors of matrices. We also introduce a modification for this method and the alternating least squares method, which ensures that alternating iterations will always converge to a semi-maximal point. (A critical point in several vector variables is semi-maximal if it is maximal with respect to each vector variable, while the other vector variables are kept fixed.) We present several numerical examples that illustrate the computational performance of the new method in comparison to the alternating least squares method.

    Comment: 17 pages and 6 figures
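    The abstract describes the core alternating step: hold one mode vector fixed, contract the tensor with it, and update the other two mode vectors from the leading singular pair of the resulting matrix, so each half-sweep cannot decrease the objective T(x, y, z). A minimal sketch of this idea for a 3-tensor is given below in Python/NumPy; the function name and defaults are our own, and it only illustrates the alternating-SVD idea, not the authors' implementation or their semi-maximality safeguard.

```python
import numpy as np

def best_rank_one_asvd(T, iters=50, seed=0):
    # Alternating-SVD-style sketch: seek lam and unit vectors x, y, z with
    # T approximately lam * outer(x, y, z).  One mode vector is held fixed and
    # the leading singular pair of the contracted matrix updates the other two.
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(T.shape[2])
    z /= np.linalg.norm(z)
    for _ in range(iters):
        # Fix z: T(., ., z) is an n1 x n2 matrix; its top singular pair gives x, y.
        U, _, Vt = np.linalg.svd(np.einsum('ijk,k->ij', T, z))
        x, y = U[:, 0], Vt[0, :]
        # Fix x: T(x, ., .) is an n2 x n3 matrix; its top singular pair gives y, z.
        U, _, Vt = np.linalg.svd(np.einsum('ijk,i->jk', T, x))
        y, z = U[:, 0], Vt[0, :]
    lam = np.einsum('ijk,i,j,k->', T, x, y, z)   # value of the multilinear form
    return lam, x, y, z

# Example: approximate a random 4 x 5 x 6 tensor by a rank-one term.
T = np.random.rand(4, 5, 6)
lam, x, y, z = best_rank_one_asvd(T)
approx = lam * np.einsum('i,j,k->ijk', x, y, z)
```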

    Fast truncation of mode ranks for bilinear tensor operations

    We propose a fast algorithm for mode rank truncation of the result of a bilinear operation on 3-tensors given in the Tucker or canonical form. If the arguments and the result have mode sizes n and mode ranks r, the computation costs O(nr^3 + r^4). The algorithm is based on the cross approximation of Gram matrices, and the accuracy of the resulting Tucker approximation is limited by the square root of machine precision.

    Comment: 9 pages, 2 tables. Submitted to Numerical Linear Algebra and Applications, special edition for the ICSMT conference, Hong Kong, January 201
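    For orientation, the sketch below (Python/NumPy, our own hypothetical illustration) shows what mode rank truncation of a Tucker tensor amounts to in the plain case of orthonormal factors: the rank of one mode is reduced via an SVD of the small core unfolding, without ever forming the full tensor. It deliberately does not reproduce the paper's algorithm, whose point is to perform the analogous truncation for the result of a bilinear operation at cost O(nr^3 + r^4) via cross approximation of Gram matrices, which is also why the attainable accuracy is about the square root of machine precision.

```python
import numpy as np

def truncate_mode_1(core, U1, U2, U3, new_rank):
    # Hypothetical illustration, not the paper's algorithm: reduce the first
    # mode rank of a Tucker tensor T = core x_1 U1 x_2 U2 x_3 U3, assuming
    # U1, U2, U3 have orthonormal columns.  All work is done on the small
    # r1 x (r2*r3) core unfolding, so the full n1 x n2 x n3 tensor is never built.
    r1, r2, r3 = core.shape
    G1 = core.reshape(r1, r2 * r3)                  # mode-1 unfolding of the core
    W, _, _ = np.linalg.svd(G1, full_matrices=False)
    W = W[:, :new_rank]                             # dominant mode-1 subspace
    new_core = np.einsum('ai,ijk->ajk', W.T, core)  # project the core onto it
    return new_core, U1 @ W, U2, U3                 # absorb the rotation into U1
```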